There is a VaR beyond usual approximations
Basel II and Solvency 2 both use the Value-at-Risk (VaR) as the risk measure
to compute the Capital Requirements. In practice, to calibrate the VaR, a
normal approximation is often chosen for the unknown distribution of the yearly
log returns of financial assets. This is usually justified by the use of the
Central Limit Theorem (CLT), when assuming aggregation of independent and
identically distributed (iid) observations in the portfolio model. Such a
modeling choice, in particular the use of light-tailed distributions, proved
during the 2008/2009 crisis to be an inadequate approximation in the presence
of extreme returns; as a consequence, it leads to a gross underestimation of
the risks. The main objective of our study is to obtain the
most accurate evaluations of the aggregated risks distribution and risk
measures when working with financial or insurance data in the presence of
heavy tails, and to provide practical solutions for accurately estimating high
quantiles of aggregated risks. We explore a new method, called Normex, to
handle this problem numerically as well as theoretically, based on properties
of upper order statistics. Normex provides accurate results, only weakly
dependent upon the sample size and the tail index. We compare it with existing
methods.
Comment: 33 pages, 5 figures
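As a rough illustration of the problem this abstract describes (not of the Normex method itself), the sketch below compares the empirical high quantile of a sum of iid heavy-tailed losses with the CLT-based normal approximation; the Pareto tail index, portfolio size, and quantile level are illustrative assumptions:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(0)

# Aggregate n iid heavy-tailed (Pareto) losses many times and compare the
# empirical high quantile of the aggregate with the normal approximation.
alpha, n, n_sim, level = 2.2, 52, 200_000, 0.999
losses = rng.pareto(alpha, size=(n_sim, n)) + 1.0   # Pareto(alpha), x_m = 1
agg = losses.sum(axis=1)
q_emp = np.quantile(agg, level)                     # empirical quantile of the sum

# Normal approximation: match the exact mean and variance of the aggregate.
mean = n * alpha / (alpha - 1.0)
var = n * alpha / ((alpha - 1.0) ** 2 * (alpha - 2.0))
q_normal = mean + np.sqrt(var) * NormalDist().inv_cdf(level)

print(q_emp, q_normal)  # the normal approximation underestimates the quantile
```

With a finite-variance but heavy (infinite third moment) tail, the empirical quantile of the aggregate sits well above the normal approximation, which is exactly the underestimation the abstract refers to.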
Level crossings and other level functionals of stationary Gaussian processes
This paper presents a synthesis on the mathematical work done on level
crossings of stationary Gaussian processes, with some extensions. The main
results [(factorial) moments, representation into the Wiener Chaos, asymptotic
results, rate of convergence, local time and number of crossings] are
described, as well as the different approaches [normal comparison method, Rice
method, Stein-Chen method, a general $m$-dependent method] used to obtain them;
these methods are also very useful in the general context of Gaussian fields.
Finally some extensions [time occupation functionals, number of maxima in an
interval, process indexed by a bidimensional set] are proposed, illustrating
the generality of the methods. A large inventory of papers and books on the
subject ends the survey.
Comment: Published at http://dx.doi.org/10.1214/154957806000000087 in the
Probability Surveys (http://www.i-journals.org/ps/) by the Institute of
Mathematical Statistics (http://www.imstat.org)
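One of the central results surveyed here, Rice's formula for the expected number of level crossings, can be checked numerically. The sketch below uses the stationary Gaussian process $X(t)=\xi_1\cos t+\xi_2\sin t$ (covariance $r(\tau)=\cos\tau$), chosen only because it is trivial to simulate exactly:

```python
import numpy as np

rng = np.random.default_rng(1)

# X(t) = xi1*cos(t) + xi2*sin(t): stationary Gaussian, r(tau) = cos(tau),
# so r(0) = 1 and -r''(0) = 1.  Rice's formula then gives
#   E[#crossings of level u on [0, T]] = (T/pi) * exp(-u**2 / 2).
u, T = 1.0, 20.0
t = np.linspace(0.0, T, 1001)

n_sim = 10_000
xi = rng.standard_normal((n_sim, 2))
X = xi[:, :1] * np.cos(t) + xi[:, 1:] * np.sin(t)        # (n_sim, len(t))
crossings = (np.diff(np.sign(X - u), axis=1) != 0).sum(axis=1)
mc_mean = crossings.mean()

rice = (T / np.pi) * np.exp(-u**2 / 2)
print(mc_mean, rice)  # Monte Carlo average vs Rice's formula
```

The Monte Carlo average of the crossing counts agrees with Rice's formula up to sampling and discretization error.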
On the capacity functional of excursion sets of Gaussian random fields on $\mathbb{R}^2$
When a random field $(X_t,\, t \in \mathbb{R}^2)$ is thresholded on a given
level $u$, the excursion set is given by its indicator $\mathbb{1}_{X \ge u}$.
The purpose of this work is to study functionals (as established in stochastic
geometry) of these random excursion sets, as e.g. the capacity functional as
well as the second moment measure of the boundary length. It extends results
obtained for the one-dimensional case to the two-dimensional case, with tools
borrowed from crossings theory, in particular Rice methods, and from integral
and stochastic geometry.
Comment: 19 pages
The impact of systemic risk on the diversification benefits of a risk portfolio
Risk diversification is the basis of insurance and investment. It is thus
crucial to study the effects that could limit it. One of them is the existence
of systemic risk that affects all the policies at the same time. We introduce
here a probabilistic approach to examine the consequences of its presence on
the risk loading of the premium of a portfolio of insurance policies. This
approach could easily be generalized to investment risk. We see that, even
with a small probability of occurrence, systemic risk can dramatically reduce
the diversification benefits. Its effect is clearly revealed via a
non-diversifiable term that appears in the analytical expression of the
variance of our models.
We propose two ways of introducing it and discuss their advantages and
limitations. By using both VaR and TVaR to compute the loading, we see that
only the latter captures the full effect of systemic risk when its probability
of occurrence is low.
Comment: 17 pages, 5 tables
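The effect described in the last sentence can be reproduced in a toy mixture model (an illustrative assumption, not the paper's model): with small probability a systemic state raises every policy's claim probability, and when that probability is below $1-\alpha$, VaR at level $\alpha$ barely reacts while TVaR does:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy model: with probability p_sys a systemic event raises every policy's
# claim probability from q to q_sys; otherwise claims are iid Bernoulli(q).
# Note p_sys = 0.005 < 1 - alpha = 0.01, so the systemic state hides from VaR.
n_pol, q, q_sys, p_sys = 1000, 0.01, 0.5, 0.005
alpha, n_sim = 0.99, 100_000

systemic = rng.random(n_sim) < p_sys
S = rng.binomial(n_pol, np.where(systemic, q_sys, q))  # aggregate claims

var = np.quantile(S, alpha)                  # VaR at 99%
tail = np.sort(S)[int(alpha * n_sim):]       # worst 1% of scenarios
tvar = tail.mean()                           # TVaR at 99%
print(var, tvar)  # TVaR is far larger: it averages over the systemic scenarios
```

VaR stays near the quantile of the normal (non-systemic) state, while TVaR, averaging the whole tail, is an order of magnitude larger because roughly half of the worst 1% of scenarios are systemic.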
On the second moment of the number of crossings by a stationary Gaussian process
Cram\'{e}r and Leadbetter introduced in 1967 the sufficient condition
$\int_{0^+} \frac{r''(t)-r''(0)}{t}\,dt < \infty$ to have a
finite variance of the number of zeros of a centered stationary Gaussian
process with twice differentiable covariance function $r$. This condition is
known as the Geman condition, since Geman proved in 1972 that it was also a
necessary condition. Up to now no such criterion was known for counts of
crossings of a level other than the mean. This paper shows that the Geman
condition is still sufficient and necessary to have a finite variance of the
number of crossings of any fixed level. For the generalization to the number
of crossings of a curve, a condition on the curve has to be added to the Geman
condition.
Comment: Published at http://dx.doi.org/10.1214/009117906000000142 in the
Annals of Probability (http://www.imstat.org/aop/) by the Institute of
Mathematical Statistics (http://www.imstat.org)
What is the best risk measure in practice? A comparison of standard measures
Expected Shortfall (ES) has been widely accepted as a risk measure that is
conceptually superior to Value-at-Risk (VaR). At the same time, however, it has
been criticised for issues relating to backtesting. In particular, ES has been
found not to be elicitable, which means that backtesting for ES is less
straightforward than, e.g., backtesting for VaR. Expectiles have been suggested
as potentially better alternatives to both ES and VaR. In this paper, we
revisit commonly accepted desirable properties of risk measures like coherence,
comonotonic additivity, robustness and elicitability. We check VaR, ES and
Expectiles with regard to whether or not they enjoy these properties, with
particular emphasis on Expectiles. We also consider their impact on capital
allocation, an important issue in risk management. We find that, despite the
caveats that apply to the estimation and backtesting of ES, it can be
considered a good risk measure. As a consequence, there is no sufficient
evidence to justify an all-inclusive replacement of ES by Expectiles in
applications. For backtesting ES, we propose an empirical approach that
consists in replacing ES by a set of four quantiles, which should make it
possible to use backtesting methods for VaR.
Keywords: Backtesting; capital allocation; coherence; diversification;
elicitability; expected shortfall; expectile; forecasts; probability integral
transform (PIT); risk measure; risk management; robustness; value-at-risk
Comment: 27 pages, 1 table
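The proposed backtesting idea, replacing ES by a set of four quantiles, amounts to a Riemann-sum approximation of $ES_\alpha = \frac{1}{1-\alpha}\int_\alpha^1 q_u\,du$. The sketch below applies a left Riemann sum over four equal subintervals of $[\alpha, 1)$ to a standard normal loss; the specific quantile levels used here are an assumption for illustration, not necessarily the paper's:

```python
from statistics import NormalDist
from math import exp, pi, sqrt

nd = NormalDist()
alpha = 0.975

# Four-quantile approximation of ES: left Riemann sum of the quantile
# function over four equal subintervals of [alpha, 1).
levels = [alpha, 0.75 * alpha + 0.25, 0.5 * alpha + 0.5, 0.25 * alpha + 0.75]
es_approx = sum(nd.inv_cdf(u) for u in levels) / 4

# Closed-form ES of a standard normal: phi(z_alpha) / (1 - alpha).
z = nd.inv_cdf(alpha)
es_exact = exp(-z * z / 2) / (sqrt(2 * pi) * (1 - alpha))
print(es_approx, es_exact)
```

Because the quantile function is increasing, the left Riemann sum slightly underestimates ES, but four quantiles already land within a few percent of the closed-form value.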